  • Friday, September 27, 2024

    In a recent interview with Ben Thompson on Stratechery, Meta's Chief Technology Officer Andrew Bosworth discussed the company's advancements in augmented reality (AR) and virtual reality (VR), focusing in particular on the Orion AR glasses and the broader implications for Meta Reality Labs. Bosworth, who has been with Meta since its early days, shared insights into his journey within the company, the development of key products, and the competitive landscape with companies like Apple.

    Bosworth praised the recent Meta Connect 2024 keynote, highlighting CEO Mark Zuckerberg's engaging presentation style and the impressive demonstrations of new technologies. He singled out the Orion AR glasses, which, despite not yet being a consumer product, showcased remarkable capabilities. The glasses, which reportedly cost around $10,000 each, were described as comfortable and transformative, offering a field of view that made the experience feel more immersive than other devices on the market, including Apple's Vision Pro. Bosworth emphasized that glasses like Orion could eventually replace smartphones, marking a significant shift in how users interact with technology. The interview delved into the technical aspects of Orion, explaining that it offloads computing to a separate unit, referred to as a "puck," which connects wirelessly to the glasses. Bosworth acknowledged challenges in user input methods but expressed optimism about the device's potential. He also discussed the competitive dynamics between Meta and Apple, noting that while Meta has a larger developer ecosystem, Apple excels in manufacturing and could pose a significant challenge in scaling production.

    Reflecting on his career, Bosworth recounted his early experiences in technology, including his time at Harvard, where he taught Zuckerberg. He shared anecdotes about the development of Facebook's News Feed and the controversies surrounding it, illustrating how user feedback shaped the platform's evolution. Bosworth also addressed internal discussions at Meta about the direction of Reality Labs, emphasizing the importance of focusing on user experience and the need for a clear vision in product development.

    The conversation touched on the bifurcation between AR and VR, with Bosworth explaining how AI is becoming a crucial component in enhancing user experiences across both domains. He highlighted AI's potential to revolutionize content creation and user interaction, particularly in Horizon Worlds, Meta's virtual environment. Bosworth also discussed the partnership with Ray-Ban, which has helped Meta create stylish, wearable smart glasses, and emphasized the importance of aesthetics in technology adoption. He expressed confidence in Meta's position in the market, viewing competition with Apple as a healthy dynamic that could drive innovation and investment in the AR/VR space. In closing, Bosworth conveyed pride in the advancements made at Meta, particularly with Orion, and optimism about the future of AR and VR technologies. He acknowledged the challenges ahead but remained committed to delivering innovative products that could redefine how users interact with technology.

  • Thursday, September 26, 2024

    Meta has unveiled a new prototype for augmented reality (AR) glasses, named Orion, which signifies a shift from the company's previous focus on bulky virtual reality (VR) headsets. During the Meta Connect keynote, CEO Mark Zuckerberg showcased these lightweight glasses, weighing only about 100 grams, as a glimpse into the future of AR technology. The Orion prototype aims to provide a more comfortable and practical alternative to existing VR devices, which tend to be heavier and less user-friendly. The design of the Orion glasses emphasizes the need for them to be lightweight and resemble traditional eyewear, avoiding the bulkiness associated with VR headsets like the Meta Quest 3. To achieve this, some processing is offloaded to a small wireless "puck" that connects to the glasses, allowing for a more streamlined design. The glasses utilize innovative microprojection technology, where tiny projectors embedded in the arms of the glasses project images into specially designed waveguides. This technology enables the display of holographic images layered over the real world, providing a true augmented reality experience rather than just a passthrough view. Zuckerberg highlighted the challenges of ensuring that the projected images are sharp and bright enough to be visible in various lighting conditions. The Orion glasses boast a field of view of 70 degrees, which is larger than that of competitors like Microsoft's HoloLens 2 and the Magic Leap One. Users can interact with the holograms through voice commands, hand gestures, and eye tracking, but a notable feature is the "neural interface" wristband. This wristband can detect subtle wrist and finger movements, allowing users to control the AR experience without needing to speak or make large gestures. Overall, the Orion prototype represents Meta's ambition to redefine the AR landscape, moving towards a future where augmented reality is seamlessly integrated into everyday life through lightweight and user-friendly devices.
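
    Orion has no public SDK, so the following is only a rough sketch of the interaction model the wristband implies: the band classifies electromyography (EMG) signals into discrete gesture events, and the application routes those events to UI actions. All names, gesture labels, and thresholds below are invented for illustration.

      # Hypothetical sketch of EMG-wristband input handling; Orion's real
      # interfaces are not public, so every name here is an assumption.
      from dataclasses import dataclass
      from typing import Callable, Iterable

      @dataclass
      class WristEvent:
          gesture: str       # e.g. "pinch", "double_pinch", "swipe_left"
          confidence: float  # classifier confidence in [0, 1]

      def dispatch(events: Iterable[WristEvent],
                   handlers: dict[str, Callable[[], None]],
                   threshold: float = 0.8) -> None:
          """Route high-confidence gestures to UI actions; drop uncertain ones."""
          for ev in events:
              handler = handlers.get(ev.gesture)
              if handler and ev.confidence >= threshold:
                  handler()

      # Example: select a hologram on a pinch; the low-confidence swipe is ignored.
      dispatch(
          [WristEvent("pinch", 0.93), WristEvent("swipe_left", 0.55)],
          {"pinch": lambda: print("select hologram"),
           "swipe_left": lambda: print("dismiss panel")},
      )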

  • Thursday, September 26, 2024

    Mark Zuckerberg envisions a future where augmented reality (AR) glasses, specifically Meta's Orion, will replace smartphones as the primary computing device. During an interview at Meta Connect, he discussed the long development journey of Orion, which has been in the works for nearly a decade. Initially intended as a consumer product, the glasses have instead become a sophisticated demo due to production costs and technical challenges. Zuckerberg expressed confidence that AR glasses represent the next major platform shift, akin to the transition from desktop to mobile.

    The partnership with EssilorLuxottica, the eyewear conglomerate behind Ray-Ban, is pivotal to Meta's strategy: Zuckerberg believes the collaboration could make EssilorLuxottica for smart glasses what Samsung is to smartphones, positioning Meta to tap into a potentially massive market. The current iteration of Ray-Ban smart glasses has seen early success, indicating a consumer appetite for stylish, functional eyewear that integrates technology without overwhelming users. Zuckerberg's demeanor during the interview reflected a newfound confidence and a willingness to engage in self-reflection on Meta's past controversies, including its role in political discourse and social media's impact on mental health. He acknowledged the challenges of navigating public perception and emphasized a desire for Meta to adopt a nonpartisan stance going forward.

    The conversation also touched on the integration of AI into the glasses, enhancing their functionality and user experience. Zuckerberg believes that as AI capabilities grow, users will increasingly rely on glasses for tasks traditionally performed on smartphones, leading to a gradual shift in how people interact with technology. While smartphones will not disappear immediately, he anticipates that AR glasses will become more integral to daily life, allowing users to engage with digital content in a more immersive and seamless manner, and that the glasses will evolve to meet consumer needs, ultimately reshaping the landscape of personal computing.

  • Monday, September 9, 2024

    Apple is reportedly considering developing non-AR smart glasses to rival the Meta Ray-Ban glasses. The company had previously renewed its efforts to build true augmented reality smart glasses but is now weighing less ambitious options. Removing AR features should result in a cheaper bill of materials and less complex development, allowing Apple to bring the glasses to market much faster than AR-equipped specs. Samsung is also working on a pair of smart glasses, expected to be powered by Google software.

  • Thursday, October 3, 2024

    Samsung is reportedly developing a competitor to the Ray-Ban Meta glasses, collaborating with Google on a project that will utilize Google's Gemini AI technology. The initiative was approved earlier in the year following extensive discussions within Google about whether to pursue full augmented reality (AR) glasses or simpler smart glasses akin to the Ray-Ban Meta model; executives from both companies ultimately decided to focus on the latter. The development comes nearly a year after Samsung filed a trademark for "Samsung Glasses" in the UK, hinting at a potential product name. Google had also been aiming to secure a partnership with EssilorLuxottica, the company that owns Ray-Ban and holds a dominant position in the global eyewear sector, which would have been a significant advantage in the market. Those efforts were unsuccessful: Meta and EssilorLuxottica recently announced an extension of their collaboration for the next decade to create multi-generational smart eyewear products. In addition to the smart glasses project, Samsung and Google are working together on a high-end mixed reality headset designed to compete with Apple's Vision Pro, with Google responsible for the software and Samsung handling the hardware, built on Qualcomm's Snapdragon XR2+ Gen 2 chipset. There are indications, however, that the headset's release may face further delays, potentially pushing it into 2025. Overall, the collaboration signifies a strategic move in the evolving market of augmented and mixed reality technologies, as both companies seek to carve out their share of an increasingly competitive space.

  • Thursday, June 20, 2024

    Snap unveiled a real-time, on-device image diffusion model for creating vivid AR experiences and introduced new generative AI tools for AR creators at the Augmented World Expo. These enhancements, which include a new Lens Studio 5.0 with an AI assistant, aim to make AR content creation faster and more efficient with features like realistic ML face effects, 3D asset generation, and character creation using text prompts.
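
    Lens Studio's GenAI APIs are proprietary, but the prompt-to-image step at the core of such tools can be illustrated with the open-source Hugging Face diffusers library as a stand-in. The checkpoint, prompt, and step count below are assumptions, and real-time, on-device use would require far heavier optimization than this sketch shows.

      # Analogy only: Snap's tooling is proprietary, so this uses the open-source
      # `diffusers` library to show what a prompt-to-image diffusion call looks like.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",  # hypothetical choice; any SD checkpoint works
          torch_dtype=torch.float16,
      )
      pipe = pipe.to("cuda")  # true on-device inference would target a mobile GPU/NPU

      # Fewer denoising steps trade quality for the latency a real-time lens needs.
      image = pipe("a face mask of a friendly robot, studio lighting",
                   num_inference_steps=8).images[0]
      image.save("lens_texture.png")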

  • Friday, March 29, 2024

    Meta is introducing AI features to its Ray-Ban smart glasses, including translation and the identification of objects, animals, and monuments, all activated through voice commands.

  • Tuesday, April 23, 2024

    Meta plans to partner with external hardware companies to build virtual reality headsets using its Meta Horizon operating system. The move will result in new hardware devices that run on the same operating system and software as Meta's current first-party virtual reality hardware, recreating the Android-versus-iOS dynamic from smartphones in virtual reality headsets. Apple will soon have to compete against a range of hardware devices at various prices, all running on Meta's operating system.

  • Monday, September 23, 2024

    Meta smart glasses have succeeded where other AI wearables and smart glasses haven't, and even beyond Meta's own expectations. They're expensive, but affordable compared to an Apple Vision Pro or a Humane AI Pin, and they have good-quality speakers, microphones, and cameras. The AI is sometimes finicky and inelegant, but it works in a natural way. The device easily slots into people's lives now, with no future software update to wait for.

  • Monday, July 22, 2024

    Google has approached Ray-Ban maker EssilorLuxottica about putting its Gemini assistant on future smart glasses. Meta, meanwhile, was recently in advanced talks with EssilorLuxottica about acquiring a five percent stake in the company, and it is unlikely that Meta will lose the partnership to Google. Google is developing an XR platform and has partnered with Samsung and Qualcomm on a device. It also previously partnered with Luxottica, before that company's merger with Essilor, to make Google Glass.

  • Tuesday, August 27, 2024

    Snap is reportedly set to unveil a new generation of augmented reality Spectacles featuring improved technology like a wider field of view and better battery life. These glasses remain developer-focused with limited production, reflecting Snap's continued interest in hardware despite previous setbacks.

  • Wednesday, April 24, 2024

    Meta has rolled out multimodal AI to its Ray-Ban Meta Smart Glasses, allowing users to process photos, audio, and text through voice commands for tasks like identification and translation, although the AI's capabilities are limited and sometimes inaccurate.
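
    Meta has not published the glasses' AI stack, but the "look and ask" pattern it implements can be sketched with an open visual-question-answering model standing in for Meta's multimodal models; the model choice and file name below are assumptions.

      # Minimal sketch of the "look and ask" pattern, using an open VQA model
      # from Hugging Face as a stand-in for Meta's unpublished pipeline.
      from transformers import pipeline

      vqa = pipeline("visual-question-answering",
                     model="dandelin/vilt-b32-finetuned-vqa")  # assumed stand-in model

      # On the glasses, the image would come from the camera and the question
      # from speech-to-text ("Hey Meta, look and tell me what plant this is").
      answer = vqa(image="snapshot.jpg", question="What kind of plant is this?")
      print(answer[0]["answer"])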

  • Thursday, August 1, 2024

    Meta has introduced the Layout app for the Quest 3 headset. This utility allows users to measure and visualize changes in their homes, place virtual objects, and use a decor leveling tool. The update also integrates an experimental AI assistant, previously available on Ray-Ban smart glasses, that answers environment-related questions. Additional features include the ability to download multiple games simultaneously, pair Touch controllers from the headset, and manage audio levels.

  • Thursday, April 25, 2024

    Mark Zuckerberg spent almost the entirety of his opening remarks during Meta's earnings call focused on the many ways the company loses money. Shares dropped by as much as 19% in extended trading on Wednesday as a result, despite Meta reporting better-than-expected profit and revenue for the first quarter. This article discusses the earnings call in detail; topics include Meta's plans for turning its AI investment into ad dollars, Llama 3, potential opportunities for expansion within the mixed reality headset market, and Meta's AR glasses.

  • Tuesday, June 4, 2024

    Google and Magic Leap have formed a strategic partnership to create immersive augmented reality experiences, signaling Google's potential return to the AR/VR market. Magic Leap had previously wavered on building a consumer AR device. Google's ongoing collaboration with Samsung on mixed reality technologies remains unaffected.

  • Tuesday, May 28, 2024

    Looking Glass has introduced a 32-inch spatial display and a 16-inch OLED variant designed to enable group 3D visualization without the need for headsets. The displays can assist in the development and presentation of interactive 3D digital images, videos, and applications in real time. They support a range of software through plugins for Unity, Unreal, Blender, and WebXR. Looking Glass provides a 3D model importer and an SDK for developing custom 3D/holographic content. The new displays are thin and designed to be easily installed anywhere.
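
    Looking Glass displays consume "quilt" images: a single texture packing many renders of the same scene from slightly shifted viewpoints, which the display multiplexes into a hologram. A minimal sketch of assembling a quilt with Pillow follows; the 8x6 layout and tile size are assumptions that vary by display model.

      # Sketch: pack per-view renders into a Looking Glass "quilt" texture.
      # The 8x6 grid and 512px tiles are assumptions; real settings vary by model.
      from PIL import Image

      COLS, ROWS = 8, 6
      TILE_W, TILE_H = 512, 512

      def make_quilt(views: list[Image.Image]) -> Image.Image:
          """Pack COLS*ROWS views into one quilt, bottom-left to top-right."""
          assert len(views) == COLS * ROWS
          quilt = Image.new("RGB", (COLS * TILE_W, ROWS * TILE_H))
          for i, view in enumerate(views):
              col, row = i % COLS, i // COLS
              # Quilt convention: view 0 is the bottom-left tile.
              y = (ROWS - 1 - row) * TILE_H
              quilt.paste(view.resize((TILE_W, TILE_H)), (col * TILE_W, y))
          return quilt

      # Hypothetical usage: 48 frames rendered with a sweeping camera,
      # faked here as flat color tiles so the script runs standalone.
      views = [Image.new("RGB", (TILE_W, TILE_H), (5 * i, 64, 128)) for i in range(48)]
      make_quilt(views).save("quilt_8x6.png")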

  • Monday, July 22, 2024

    Meta has updated its Quest mobile app, integrating it more closely with its 3D social platform, Horizon Worlds. Now rebranded as Meta Horizon, the app, which includes a new light mode, allows users to complete quests, customize avatars, explore new worlds, and connect with friends. Meta is also expanding Horizon Worlds' reach across the VR industry by licensing its headset OS to companies like Lenovo and Asus and by featuring more experimental App Lab titles.

  • Friday, October 4, 2024

    Meta has confirmed that it may use any images users ask Ray-Ban Meta AI to analyze for training its artificial intelligence systems. Initially, the company was reluctant to provide details on the matter, but further clarification has emerged. According to Emil Vazquez, a policy communications manager at Meta, images and videos shared with Meta AI in regions where multimodal AI is available, such as the U.S. and Canada, can be used to improve the AI's capabilities in accordance with Meta's Privacy Policy. The distinction is that while photos and videos taken with Ray-Ban Meta smart glasses are not used for training unless submitted to the AI, once a user requests an analysis, those images fall under different policies. Users thereby contribute, perhaps inadvertently, to a growing database that Meta can leverage to refine its AI models; the only way to avoid this is to refrain from using the AI features altogether.

    This raises significant privacy concerns, as users may not fully grasp that they are providing Meta with potentially sensitive images, which could include personal spaces or identifiable individuals. Although Meta asserts that the process is made clear in the user interface, there appears to have been a lack of initial transparency from the company about these practices. It was previously known that Meta trains its Llama AI models on publicly available data from platforms like Instagram and Facebook, but this new approach extends that definition to include any images analyzed through the smart glasses.

    The timing of the revelation is particularly pertinent, as Meta has recently introduced new AI features that encourage users to interact with the AI in a more intuitive manner, increasing the likelihood of sharing new data for training purposes. A notable addition is a live video analysis feature that streams images to Meta's AI models, allowing users to receive, for example, outfit suggestions based on their wardrobe. The company does not prominently disclose that these images are also sent to Meta for training. Meta's privacy policy explicitly states that interactions with AI features can be used to train AI models, which encompasses images shared through the Ray-Ban smart glasses, and the terms of service indicate that by sharing images, users consent to Meta analyzing them, including facial features.

    This is compounded by Meta's recent legal history: the company settled a $1.4 billion lawsuit in Texas concerning its facial recognition practices, a case that revolved around Facebook's "Tag Suggestions" feature, which was made opt-in after significant backlash. Notably, some of Meta AI's image features are not available in Texas due to these legal issues. Meta also retains transcriptions of voice interactions with Ray-Ban Meta by default to train future AI models, although users can opt out of having their voice recordings used for this purpose when they first log into the app.

    The broader context is a trend among tech companies, including Meta and Snap, of promoting smart glasses as a new computing platform. These camera-equipped devices raise privacy concerns reminiscent of the issues surrounding Google Glass, and reports have surfaced of individuals hacking Ray-Ban Meta glasses to access personal information about those they encounter, further highlighting the potential risks associated with the technology.

  • Tuesday, July 23, 2024

    Meta has filed a patent for a feature similar to Apple's EyeSight. The feature displays a virtual image of the user's eyes and may include health sensors in the face interface, differentiating it from Apple's version. Because patent law covers specific methods rather than broad concepts, Meta can develop this feature despite its similarity to EyeSight, which Apple may drop from a more affordable Vision model due to its lackluster reception.

  • Thursday, October 3, 2024

    In a Reddit community dedicated to experienced developers, a user shared a compelling story about a challenging bug investigation they undertook while working at a company that specialized in augmented reality devices for industrial applications. The user, who had a diverse skill set in software engineering, was tasked with diagnosing a peculiar bug in the pose prediction code, which was crucial for rendering AR objects accurately based on user movements. The issue was subtle and occurred infrequently, making it difficult to detect. It manifested as visual glitches that were only noticeable to human users, occurring once a week for about 70% of the devices in use. This was particularly concerning because the devices were worn by industrial workers engaged in tasks that required high levels of focus and balance, posing a risk of physical harm.

    The investigation was complicated by the system's intricate sensor and data flow, which made it challenging to introduce additional monitoring without affecting performance. The user resorted to setting up robotic arms, lasers, and a high-speed camera to gather objective data on the projection system. Through this method, they discovered that the bug consistently appeared on Wednesdays, leading them to investigate the time settings of the devices.

    The breakthrough came when the user realized that the production devices were primarily set to Pacific Standard Time (PST), while many development models operated on different time zones, including Austrian time or UTC. This discrepancy was linked to the language settings of the embedded operating systems, which were predominantly set to German. The user found that the code responsible for handling timestamps was flawed, particularly in how it translated day-of-week words between German and English.

    The root cause was traced back to a clever but ultimately misguided coding approach by a computer vision researcher, who had implemented a system that sent timestamps in a format that included the day of the week in German. The code that translated these timestamps to English was not robust enough to handle all cases, particularly Wednesdays. This led to the system misinterpreting the timestamps, causing the pose prediction system to behave erratically and creating a dangerous situation for users. The recovery code, which was intended to correct the discrepancies, was poorly designed and did not log useful information, making it difficult to identify the underlying problem. The result was a complex interplay of factors that culminated in a significant risk to user safety, all stemming from a combination of language settings and a flawed timestamp handling mechanism.

    Ultimately, the user's detailed account highlights the challenges faced in software development, particularly in high-stakes environments where precision is critical. The story serves as a cautionary tale about the importance of thorough testing and the potential consequences of seemingly minor oversights in code.
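
    The post does not include the actual code, but the class of bug it describes can be reconstructed. In the sketch below, the message format, the day-name heuristic, and the silent recovery loop are all invented for illustration; the hook is that Mittwoch (Wednesday) is the only German weekday name that does not end in "-tag", which is one plausible way a clever shortcut could fail on exactly one day of the week.

      # Hypothetical reconstruction of the bug class described in the post; the
      # original code is not public, so the format and heuristic are invented.
      from datetime import datetime, timedelta

      GERMAN_TO_ENGLISH = {
          "Montag": "Monday", "Dienstag": "Tuesday", "Mittwoch": "Wednesday",
          "Donnerstag": "Thursday", "Freitag": "Friday",
          "Samstag": "Saturday", "Sonntag": "Sunday",
      }

      def parse_device_timestamp(raw: str) -> datetime:
          """Parse messages shaped like 'Mittwoch 2024-09-25 13:37:02'."""
          day_word, rest = raw.split(" ", 1)
          ts = datetime.strptime(rest, "%Y-%m-%d %H:%M:%S")

          # "Clever" shortcut: treat any word ending in "tag" as a German day
          # name. True for six of the seven days, but not for Mittwoch.
          if day_word.endswith("tag"):
              day_word = GERMAN_TO_ENGLISH[day_word]

          if day_word != ts.strftime("%A"):
              # Silent "recovery": rewind until the day-of-week matches,
              # logging nothing. "Mittwoch" never equals an English day name,
              # so on Wednesdays this rewinds a full week and hands the pose
              # predictor stale data.
              for _ in range(7):
                  ts -= timedelta(days=1)
                  if day_word == ts.strftime("%A"):
                      break
          return ts

      print(parse_device_timestamp("Montag 2024-09-23 08:00:00"))    # correct
      print(parse_device_timestamp("Mittwoch 2024-09-25 13:37:02"))  # silently off by 7 days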

  • Wednesday, May 15, 2024

    Google announced many new features at I/O 2024, including Gemini Flash, Veo video generation, Imagen 3, and its newest assistant effort, Project Astra. In all, there is an impressive number of improvements, including a 2M-token context length, dramatically cheaper models, and improved multimodality.
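
    For a concrete sense of the announcement, a minimal call to Gemini 1.5 Flash through the google-generativeai Python SDK looks like the sketch below; the model name matches the announcement, while the file and prompt are assumptions. The long-context headline means requests can carry very large inputs (up to 2M tokens on Gemini 1.5 Pro).

      # Minimal sketch of a Gemini 1.5 Flash call via the google-generativeai
      # SDK; the document and prompt are assumptions for illustration.
      import google.generativeai as genai

      genai.configure(api_key="YOUR_API_KEY")
      model = genai.GenerativeModel("gemini-1.5-flash")

      # Long context: pass a document that would overflow smaller windows.
      with open("big_spec.txt") as f:
          doc = f.read()
      response = model.generate_content(
          ["Summarize the key requirements in this document:", doc])
      print(response.text)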

  • Friday, June 28, 2024

    Snap has released the GenAI suite in Lens Studio 5.0. This is a great advancement and is massively useful for developing AR applications.

  • Tuesday, March 19, 2024

    Headspace XR is a mixed and virtual reality experience created by Headspace, Meta, and Nexus Studios. It targets Gen Z users, aiming to strengthen their mind-body connection and alleviate loneliness. However, as with any innovation or new technology, there's a danger that this becomes just another complicated distraction.

  • Tuesday, April 23, 2024

    There's been a shift in attention from the metaverse to the Apple Vision Pro. Brands have been testing immersive experiences and integrating them into users' real-world environments. Companies like e.l.f. Beauty have been seeing 50% higher conversion rates on their Vision Pro apps compared to their mobile apps. Wayfair, Lowe's, and J.Crew have released apps for Vision Pro that let users experience products through the headset. Any kind of experience designed for the headset is helping brands stand out at this point.

  • Monday, May 27, 2024

    Augmented reality has become a tempting marketing tool for brands tired of shoppers seeing their products as static offerings on store shelves. Research suggests that these phone-based visual experiences can help drive sales, particularly for smaller brands with a narrower audience in mind. Depending on the approach, AR campaigns can help shoppers pick between product varieties or even boost shopper loyalty through email capture or special offers.

  • Friday, May 17, 2024

    Meta is reportedly working on AI-powered earphones equipped with cameras. Internally codenamed 'Camerabuds', the earphones will leverage AI capabilities for real-time object identification and foreign language translation. Meta's leadership sees AI-powered earphones as the next logical step in the evolution of wearable technology. It has partnered with Kansas-based electronics company Ear Micro to explore the possibilities of this emerging technology.

  • Monday, June 24, 2024

    The MX Ink stylus, Logitech's first mixed reality tool for Meta Quest 2 and 3 headsets, offers a natural, pencil-like feel for 3D content creation. Features like swappable pressure-sensitive tips and haptic feedback enhance the creative process for AR and VR artists. Despite its $129.99 price tag and lack of Meta Quest Pro support, its release later this year could make it an excellent tool for advancing 3D design.

  • Wednesday, May 15, 2024

    Google introduced Project Starline, a holographic video-conferencing interface that utilizes advanced 3D imaging, artificial intelligence, and new display technologies, at its I/O conference in 2021. The platform aims to bridge the gap between physical and virtual interactions, allowing users to feel as though they are in a face-to-face meeting. The concept is now heading toward practical use: Google announced it will work with HP to commercialize the technology in 2025 and integrate it into video-conferencing platforms such as Google Meet and Zoom.